VC Dimension of Sigmoidal and General Pfaffian Networks

Authors

  • Marek Karpinski
  • Angus Macintyre
Abstract

We introduce a new method for proving explicit upper bounds on the VC Dimension of general functional basis networks, and prove as an application, for the first time, that the VC Dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^(−y)) is bounded by a quadratic polynomial O((lm)^2) in both the number l of programmable parameters and the number m of nodes. The proof method of this paper generalizes to a much wider class of Pfaffian activation functions and formulas, and also gives, for the first time, polynomial bounds on their VC Dimension. We also present some other applications of our method.
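The sigmoidal activation named in the abstract can be sketched concretely. The following is a minimal illustration (not from the paper; the one-hidden-layer shape and all names are hypothetical) showing how the parameter count l and node count m in the bound O((lm)^2) arise:

```python
import math

def sigmoid(y: float) -> float:
    """Standard sigmoidal activation: sigma(y) = 1 / (1 + e^(-y))."""
    return 1.0 / (1.0 + math.exp(-y))

def one_hidden_layer(x: float, weights, biases, out_weights) -> float:
    """Hypothetical network with m = len(weights) sigmoidal hidden nodes.
    The programmable parameters (l of them) are the entries of weights,
    biases, and out_weights."""
    hidden = [sigmoid(w * x + b) for w, b in zip(weights, biases)]
    return sum(v * h for v, h in zip(out_weights, hidden))
```

Here a network with m hidden nodes has l = 3m programmable parameters, and the paper's result bounds the VC Dimension of the induced classifier family by O((lm)^2).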


Related papers

Polynomial Bounds for VC Dimension of Sigmoidal and General Pfaffian Neural Networks

We introduce a new method for proving explicit upper bounds on the VC Dimension of general functional basis networks, and prove as an application, for the first time, that the VC Dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^(−y)) is bounded by a quadratic polynomial O((lm)^2) in both the number l of programmable parameters and the number m of nodes. The pro...

Full text

Neural Networks with Quadratic VC Dimension

This paper shows that neural networks which use continuous activation functions have VC dimension at least as large as the square of the number of weights w. This result settles a long-standing open question, namely whether the well-known O(w log w) bound, known for hard-threshold nets, also held for more general sigmoidal nets. Implications for the number of samples needed for valid generalizati...

Full text

Polynomial Bounds for the VC-Dimension of Sigmoidal, Radial Basis Function, and Sigma-pi Networks

W^2 h^2 is an asymptotic upper bound for the VC-dimension of a large class of neural networks, including sigmoidal, radial basis function, and sigma-pi networks, where h is the number of hidden units and W is the number of adjustable parameters; this extends Karpinski and Macintyre's recent results. The class is characterized by polynomial input functions and activation functions that are sol...

Full text

Neural Networks with Quadratic VC Dimension (Produced as Part of the ESPRIT Working Group in Neural and Computational Learning, NeuroCOLT 8556; Submitted to Workshop on Neural Information Processing, NIPS'95)

This paper shows that neural networks which use continuous activation functions have VC dimension at least as large as the square of the number of weights w. This result settles a long-standing open question, namely whether the well-known O(w log w) bound, known for hard-threshold nets, also held for more general sigmoidal nets. Implications for the number of samples needed for valid generaliza...

Full text

Approximating the Volume of General Pfaffian Bodies

We introduce a new, powerful method of approximating the volume (and integrals) for a vast number of geometric body classes defined by Boolean combinations of Pfaffian conditions. The method depends on the VC Dimension of the underlying classes of bodies. The resulting approximation algorithms are quite different in spirit from all previously known methods, and give randomized solutions eve...

Full text


Journal:
  • Electronic Colloquium on Computational Complexity (ECCC)

Volume 2, Issue -

Pages -

Publication date: 1995